Beyond Backpropagation: Using Simulated Annealing for Training Neural Networks

Authors

  • Randall S. Sexton
  • Robert E. Dorsey
  • John D. Johnson
Abstract

The vast majority of neural network research relies on a gradient algorithm, typically a variation of backpropagation, to obtain the weights of the model. Because of the enigmatic nature of complex nonlinear optimization problems, such as training artificial neural networks, this technique has often produced inconsistent and unpredictable results. To go beyond backpropagation's typical selection of local solutions, simulated annealing is suggested as an alternative training technique that will search globally. In this research, backpropagation is directly compared with this global search technique via an intensive Monte Carlo study on seven test functions.
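The abstract contrasts gradient-based backpropagation with the global search performed by simulated annealing. As a minimal illustrative sketch (not the paper's experimental setup — the network size, cooling schedule, step sizes, and the XOR task here are assumptions), simulated annealing can train a small network by proposing random weight perturbations and accepting uphill moves with a temperature-dependent probability:

```python
import math
import random

random.seed(0)

# Toy training set: the XOR problem.
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def forward(w, x):
    # 2 inputs -> 2 tanh hidden units -> 1 linear output (9 weights total).
    h0 = math.tanh(w[0] * x[0] + w[1] * x[1] + w[2])
    h1 = math.tanh(w[3] * x[0] + w[4] * x[1] + w[5])
    return w[6] * h0 + w[7] * h1 + w[8]

def sse(w):
    # Sum-of-squared-errors objective over the training set.
    return sum((forward(w, x) - y) ** 2 for x, y in XOR)

def anneal(n_weights=9, t0=1.0, cooling=0.999, steps=20000, step=0.5):
    w = [random.uniform(-1, 1) for _ in range(n_weights)]
    e, t = sse(w), t0
    best_w, best_e = w[:], e
    for _ in range(steps):
        # Perturb all weights; shrink the step as the temperature falls.
        sigma = step * max(t, 0.05)
        cand = [wi + random.gauss(0, sigma) for wi in w]
        ec = sse(cand)
        # Always accept downhill moves; accept uphill moves with
        # Boltzmann probability exp(-delta/T), which lets the search
        # escape local minima while the temperature is still high.
        if ec < e or random.random() < math.exp(-(ec - e) / t):
            w, e = cand, ec
            if e < best_e:
                best_w, best_e = w[:], e
        t *= cooling  # geometric cooling schedule
    return best_w, best_e

w, err = anneal()
print(f"final SSE: {err:.4f}")
```

Note that no gradient is computed anywhere: the acceptance rule and cooling schedule alone drive the search, which is what allows it to jump out of the local minima that trap gradient methods.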

Similar Articles

Application of Quantum Annealing to Training of Deep Neural Networks

In Deep Learning, a well-known approach for training a Deep Neural Network starts by training a generative Deep Belief Network model, typically using Contrastive Divergence (CD), then fine-tuning the weights using backpropagation or other discriminative techniques. However, the generative training can be time-consuming due to the slow mixing of Gibbs sampling. We investigated an alternative app...

Supervised Training Using Global Search Methods

Supervised learning in neural networks based on the popular backpropagation method can be often trapped in a local minimum of the error function. The class of backpropagation-type training algorithms includes local minimization methods that have no mechanism that allows them to escape the influence of a local minimum. The existence of local minima is due to the fact that the error function is t...

A methodology to training and optimize artificial neural networks weights and connections

This work presents a new methodology that integrates the heuristics tabu search, simulated annealing, genetic algorithms and backpropagation in a pruning and constructive way. The approach obtained promising results in the simultaneous optimization of the artificial neural network architecture and weights for four classification problems and one prediction problem.

Hybrid PSO-SA algorithm for training a Neural Network for Classification

In this work, we propose a hybrid particle swarm optimization-simulated annealing algorithm and present a comparison with i) the simulated annealing algorithm and ii) the backpropagation algorithm for training neural networks. These neural networks were then tested on a classification task. In particle swarm optimization, the behaviour of a particle is influenced by the experiential knowledge of the partic...

Neural Network Prediction in a System for Optimizing Simulations

Neural networks have been widely used for both prediction and classification. Backpropagation is commonly used for training neural networks, although the limitations associated with this technique are well documented. Global search techniques such as simulated annealing, genetic algorithms and tabu search have also been used for this purpose. The developers of these training methods, however, h...


Journal title:

Volume   Issue

Pages  -

Publication date 1999